A Heuristic Fast Gradient Descent Method for Unimodal Optimization
Authors
Abstract
Similar Resources
Fast gradient descent method for Mean-CVaR optimization
We propose an iterative gradient descent procedure for computing approximate solutions for the scenario-based mean-CVaR portfolio selection problem. This procedure is based on an algorithm proposed by Nesterov [13] for solving non-smooth convex optimization problems. Our procedure does not require any linear programming solver and in many cases the iterative steps can be solved in closed form. ...
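The abstract credits Nesterov's scheme for non-smooth convex problems; as a rough illustration of the accelerated-gradient idea behind such procedures, here is a generic sketch of Nesterov's accelerated gradient descent applied to a smooth convex quadratic (the function, matrix, and step size below are illustrative choices, not the paper's Mean-CVaR setup):

```python
import numpy as np

def nesterov_agd(grad, x0, L, n_iter=100):
    """Generic Nesterov accelerated gradient descent for an L-smooth
    convex objective. A sketch of the method family only, not the
    paper's Mean-CVaR procedure."""
    x = y = np.asarray(x0, dtype=float)
    t = 1.0
    for _ in range(n_iter):
        x_new = y - grad(y) / L                      # gradient step at the momentum point
        t_new = (1 + np.sqrt(1 + 4 * t * t)) / 2     # momentum schedule
        y = x_new + ((t - 1) / t_new) * (x_new - x)  # extrapolation
        x, t = x_new, t_new
    return x

# Illustrative usage: minimize f(x) = 0.5 * ||A x - b||^2,
# where L is the largest eigenvalue of A^T A (the gradient's Lipschitz constant).
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
grad = lambda x: A.T @ (A @ x - b)
L = np.linalg.eigvalsh(A.T @ A).max()
x_star = nesterov_agd(grad, np.zeros(2), L, n_iter=500)
```

Each iteration uses only a gradient evaluation and vector arithmetic, which matches the abstract's point that no linear programming solver is required.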
A New Descent Nonlinear Conjugate Gradient Method for Unconstrained Optimization
In this paper, a new nonlinear conjugate gradient method is proposed for large-scale unconstrained optimization. The sufficient descent property holds without any line search. Using a steplength technique that ensures the Zoutendijk condition, the method is proved to be globally convergent. Finally, we improve the method and carry out further analysis.
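As a generic illustration of the nonlinear conjugate gradient family this abstract belongs to, the sketch below pairs a Polak-Ribière+ direction update with Armijo backtracking as the steplength technique; the specific rules and constants are standard textbook choices, not the paper's scheme:

```python
import numpy as np

def nonlinear_cg(f, grad, x0, n_iter=200, c1=1e-4):
    """Polak-Ribiere+ nonlinear conjugate gradient with Armijo
    backtracking. A generic sketch, not the paper's method."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(n_iter):
        if np.linalg.norm(g) < 1e-10:          # converged
            break
        if g @ d >= -1e-12 * (g @ g):          # restart if d is not a descent direction
            d = -g
        alpha = 1.0
        for _ in range(60):                    # Armijo backtracking line search
            if f(x + alpha * d) <= f(x) + c1 * alpha * (g @ d):
                break
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        beta = max(0.0, g_new @ (g_new - g) / (g @ g))  # PR+ update
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Illustrative usage on a convex quadratic f(x) = 0.5 x^T Q x - b^T x.
Q = np.array([[4.0, 1.0], [1.0, 3.0]])
b_vec = np.array([1.0, 2.0])
f = lambda x: 0.5 * x @ Q @ x - b_vec @ x
grad = lambda x: Q @ x - b_vec
x_min = nonlinear_cg(f, grad, np.zeros(2))
```

The `beta = max(0, ...)` truncation is one common way to keep the search direction well behaved; the paper's construction for guaranteeing sufficient descent may differ.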
A Gradient Descent Method for Optimization of Model Microvascular Networks
Within animals, oxygen exchange occurs within networks containing potentially billions of microvessels that are distributed throughout the animal’s body. Innovative imaging methods now allow for mapping of the architecture and blood flows within real microvascular networks. However, these data streams have so far yielded little new understanding of the physical principles that underlie the orga...
A New Sufficient Descent Conjugate Gradient Method for Unconstrained Optimization
In this paper, a new conjugate gradient method with the sufficient descent property is proposed for the unconstrained optimization problem. An attractive property of the new method is that the descent direction it generates always possesses the sufficient descent property, independently of the line search used and the choice of ki . Under mild conditions, the global c...
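The sufficient descent property mentioned in this abstract is usually stated as g_k^T d_k <= -c * ||g_k||^2 for some constant c > 0. A minimal numeric check of that condition (with a hypothetical constant c, since the paper's constant is not given here):

```python
import numpy as np

def sufficient_descent_holds(g, d, c=0.5):
    """Check the sufficient descent condition g^T d <= -c * ||g||^2.
    The constant c here is a hypothetical placeholder."""
    return float(g @ d) <= -c * float(g @ g)

# The steepest descent direction d = -g satisfies the condition with c = 1.
g = np.array([1.0, -2.0])
assert sufficient_descent_holds(g, -g, c=1.0)
```

A direction with this property guarantees that each step reduces the objective to first order, which is the building block of the global convergence argument the abstract alludes to.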
A Gradient Descent Method for a Neural
It has been demonstrated that higher order recurrent neural networks exhibit an underlying fractal attractor as an artifact of their dynamics. These fractal attractors offer a very efficient mechanism to encode visual memories in a neural substrate, since even a simple twelve-weight network can encode a very large set of different images. The main problem in this memory model, which so far has r...
Journal
Journal title: Journal of Advances in Mathematics and Computer Science
Year: 2018
ISSN: 2456-9968
DOI: 10.9734/jamcs/2018/39798